Going forward, AI algorithms will be incorporated into more and more everyday applications. For example, you might want to include an image classifier in a smart phone app. To do this, you'd use a deep learning model trained on hundreds of thousands of images as part of the overall application architecture. A large part of software development in the future will be using these types of models as common parts of applications.
In this project, you'll train an image classifier to recognize different species of flowers. You can imagine using something like this in a phone app that tells you the name of the flower your camera is looking at. In practice you'd train this classifier, then export it for use in your application. We'll be using the Oxford Flowers dataset of 102 flower categories; you can see a few examples below.

The project is broken down into multiple steps: loading the image dataset and creating a pipeline, building and training an image classifier, and using the trained model to perform inference on flower images. We'll lead you through each part, which you'll implement in Python.
When you've completed this project, you'll have an application that can be trained on any set of labeled images. Here, your network will learn about flowers and end up in a command-line application. But what you do with your new skills depends on your imagination and effort in building a dataset. For example, imagine an app where you take a picture of a car, it tells you what the make and model is, then looks up information about it. Go build your own dataset and make something new.
# Import TensorFlow
import tensorflow as tf
import tensorflow_datasets as tfds
import tensorflow_hub as hub
tfds.disable_progress_bar()
# Make all other necessary imports.
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
import warnings
warnings.filterwarnings('ignore')
%matplotlib inline
%config InlineBackend.figure_format = 'retina'
import numpy as np
import matplotlib.pyplot as plt
import logging
logger = tf.get_logger()
logger.setLevel(logging.ERROR)
import json
print('Using:')
print('\t\u2022 TensorFlow version:', tf.__version__)
print('\t\u2022 tf.keras version:', tf.keras.__version__)
Using:
	• TensorFlow version: 2.6.0
	• tf.keras version: 2.6.0
Here you'll use tensorflow_datasets to load the Oxford Flowers 102 dataset. This dataset has 3 splits: 'train', 'test', and 'validation'. You'll also need to make sure the training data is normalized and resized to 224x224 pixels as required by the pre-trained networks.
The validation and testing sets are used to measure the model's performance on data it hasn't seen yet, but you'll still need to normalize and resize the images to the appropriate size.
# Download data to default local directory "~/tensorflow_datasets"
#!python -m tensorflow_datasets.scripts.download_and_prepare --register_checksums=True --datasets=oxford_flowers102
# Load the dataset with TensorFlow Datasets. Hint: use tfds.load()
dataset, dataset_info = tfds.load('oxford_flowers102', as_supervised = True, with_info = True)
dataset_info
tfds.core.DatasetInfo(
name='oxford_flowers102',
version=2.1.1,
description='The Oxford Flowers 102 dataset consists of 102 flower categories commonly occurring
in the United Kingdom. Each class consists of between 40 and 258 images. The images have
large scale, pose and light variations. In addition, there are categories that have large
variations within the category and several very similar categories.
The dataset is divided into a training set, a validation set and a test set.
The training set and validation set each consist of 10 images per class (totalling 1020 images each).
The test set consists of the remaining 6149 images (minimum 20 per class).',
homepage='https://www.robots.ox.ac.uk/~vgg/data/flowers/102/',
features=FeaturesDict({
'file_name': Text(shape=(), dtype=tf.string),
'image': Image(shape=(None, None, 3), dtype=tf.uint8),
'label': ClassLabel(shape=(), dtype=tf.int64, num_classes=102),
}),
total_num_examples=8189,
splits={
'test': 6149,
'train': 1020,
'validation': 1020,
},
supervised_keys=('image', 'label'),
citation="""@InProceedings{Nilsback08,
author = "Nilsback, M-E. and Zisserman, A.",
title = "Automated Flower Classification over a Large Number of Classes",
booktitle = "Proceedings of the Indian Conference on Computer Vision, Graphics and Image Processing",
year = "2008",
month = "Dec"
}""",
redistribution_info=,
)
# TODO: Create a training set, a validation set and a test set.
# Since the 'test' split (6,149 images) is much larger than the 'train' split (1,020), we swap them:
# the 'test' split is used for training and the 'train' split for evaluation.
training_set = dataset['test']
validation_set = dataset['validation']
test_set = dataset['train']
# Get the number of examples in each set from the dataset info.
num_training = dataset_info.splits['test'].num_examples
num_validation = dataset_info.splits['validation'].num_examples
num_test = dataset_info.splits['train'].num_examples
print('Number of training examples: ', num_training)
print('Number of validation examples: ', num_validation)
print('Number of testing examples: ', num_test)
# Get the number of classes in the dataset from the dataset info.
num_classes = dataset_info.features['label'].num_classes
print('Number of Classes: ', num_classes)
Number of training examples:  6149
Number of validation examples:  1020
Number of testing examples:  1020
Number of Classes:  102
# Print the shape and corresponding label of 3 images in the training set.
i = 0
for image, label in training_set.take(3):
    print('Shape of Image ', i + 1, '::')
    print(image.numpy().shape)
    print('Label: ', label.numpy())
    print('\n')
    i += 1
Shape of Image 1 :: (542, 500, 3)  Label: 40
Shape of Image 2 :: (748, 500, 3)  Label: 76
Shape of Image 3 :: (500, 600, 3)  Label: 42
# Plot 1 image from the training set. Set the title of the plot to the corresponding image label.
for image, label in training_set.take(1):
    image = image.numpy().squeeze()
    label = label.numpy()
    plt.imshow(image)
    plt.colorbar()
    plt.title(label)
    plt.show()
You'll also need to load in a mapping from label to category name. You can find this in the file label_map.json. It's a JSON object which you can read in with the json module. This will give you a dictionary mapping the integer coded labels to the actual names of the flowers.
with open('label_map.json', 'r') as f:
    class_names = json.load(f)
class_names
{'1': 'pink primrose',
'10': 'globe thistle',
'100': 'blanket flower',
'101': 'trumpet creeper',
'102': 'blackberry lily',
'11': 'snapdragon',
'12': "colt's foot",
'13': 'king protea',
'14': 'spear thistle',
'15': 'yellow iris',
'16': 'globe-flower',
'17': 'purple coneflower',
'18': 'peruvian lily',
'19': 'balloon flower',
'2': 'hard-leaved pocket orchid',
'20': 'giant white arum lily',
'21': 'fire lily',
'22': 'pincushion flower',
'23': 'fritillary',
'24': 'red ginger',
'25': 'grape hyacinth',
'26': 'corn poppy',
'27': 'prince of wales feathers',
'28': 'stemless gentian',
'29': 'artichoke',
'3': 'canterbury bells',
'30': 'sweet william',
'31': 'carnation',
'32': 'garden phlox',
'33': 'love in the mist',
'34': 'mexican aster',
'35': 'alpine sea holly',
'36': 'ruby-lipped cattleya',
'37': 'cape flower',
'38': 'great masterwort',
'39': 'siam tulip',
'4': 'sweet pea',
'40': 'lenten rose',
'41': 'barbeton daisy',
'42': 'daffodil',
'43': 'sword lily',
'44': 'poinsettia',
'45': 'bolero deep blue',
'46': 'wallflower',
'47': 'marigold',
'48': 'buttercup',
'49': 'oxeye daisy',
'5': 'english marigold',
'50': 'common dandelion',
'51': 'petunia',
'52': 'wild pansy',
'53': 'primula',
'54': 'sunflower',
'55': 'pelargonium',
'56': 'bishop of llandaff',
'57': 'gaura',
'58': 'geranium',
'59': 'orange dahlia',
'6': 'tiger lily',
'60': 'pink-yellow dahlia',
'61': 'cautleya spicata',
'62': 'japanese anemone',
'63': 'black-eyed susan',
'64': 'silverbush',
'65': 'californian poppy',
'66': 'osteospermum',
'67': 'spring crocus',
'68': 'bearded iris',
'69': 'windflower',
'7': 'moon orchid',
'70': 'tree poppy',
'71': 'gazania',
'72': 'azalea',
'73': 'water lily',
'74': 'rose',
'75': 'thorn apple',
'76': 'morning glory',
'77': 'passion flower',
'78': 'lotus lotus',
'79': 'toad lily',
'8': 'bird of paradise',
'80': 'anthurium',
'81': 'frangipani',
'82': 'clematis',
'83': 'hibiscus',
'84': 'columbine',
'85': 'desert-rose',
'86': 'tree mallow',
'87': 'magnolia',
'88': 'cyclamen',
'89': 'watercress',
'9': 'monkshood',
'90': 'canna lily',
'91': 'hippeastrum',
'92': 'bee balm',
'93': 'ball moss',
'94': 'foxglove',
'95': 'bougainvillea',
'96': 'camellia',
'97': 'mallow',
'98': 'mexican petunia',
'99': 'bromelia'}
def image_normalize(image, label):
    # Scale pixel values to [0, 1] and resize to the 224x224 input size
    # expected by the pre-trained networks.
    image = tf.cast(image, tf.float32)
    image /= 255
    image = tf.image.resize(image, [224, 224])
    return image, label
training_set = training_set.map(image_normalize)
validation_set = validation_set.map(image_normalize)
test_set = test_set.map(image_normalize)
training_set, validation_set, test_set
(<MapDataset shapes: ((224, 224, 3), ()), types: (tf.float32, tf.int64)>, <MapDataset shapes: ((224, 224, 3), ()), types: (tf.float32, tf.int64)>, <MapDataset shapes: ((224, 224, 3), ()), types: (tf.float32, tf.int64)>)
# Plot 1 image from the training set. Set the title of the plot to the
# corresponding class name.
for image, label in training_set.take(1):
    image = image.numpy().squeeze()
    label = label.numpy()
    plt.imshow(image)
    plt.colorbar()
    # label_map.json keys are 1-based ('1'..'102') while TFDS labels are
    # 0-based (0..101), so shift by one when looking up the class name.
    plt.title(class_names[str(label + 1)])
    plt.show()
# TODO: Create a pipeline for each set.
batch_size = 51  # 1020 / 51 = 20, so the validation and test splits batch evenly
training_batches = training_set.cache().shuffle(num_training // 4).batch(batch_size).prefetch(1)
validation_batches = validation_set.cache().batch(batch_size).prefetch(1)
testing_batches = test_set.cache().batch(batch_size).prefetch(1)
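The stage order in these pipelines matters: cache() stores decoded examples, shuffle() before batch() shuffles individual images rather than whole batches, and prefetch() overlaps input preparation with training. Here is a self-contained sketch of the same ordering on synthetic data; the shapes and dataset size are stand-ins, not the real flower data.

```python
import tensorflow as tf

# Synthetic stand-in for the flower data: 8 fake 224x224 RGB images with labels.
images = tf.random.uniform((8, 224, 224, 3))
labels = tf.range(8, dtype=tf.int64)
ds = tf.data.Dataset.from_tensor_slices((images, labels))

# Same stage order as the real pipelines above.
batches = ds.cache().shuffle(8).batch(4).prefetch(tf.data.AUTOTUNE)

for x, y in batches:
    print(x.shape, y.shape)  # (4, 224, 224, 3) (4,)
```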
Now that the data is ready, it's time to build and train the classifier. You should use the MobileNet pre-trained model from TensorFlow Hub to get the image features. Build and train a new feed-forward classifier using those features.
We're going to leave this part up to you. If you want to talk through it with someone, chat with your fellow students!
Refer to the rubric for guidance on successfully completing this section. Things you'll need to do: load the MobileNet pre-trained network from TensorFlow Hub, define a new untrained feed-forward classifier on top of it, train that classifier, and plot the training and validation loss and accuracy.
We've left a cell open for you below, but use as many as you need. Our advice is to break the problem up into smaller parts you can run separately. Check that each part is doing what you expect, then move on to the next. You'll likely find that as you work through each part, you'll need to go back and modify your previous code. This is totally normal!
When training make sure you're updating only the weights of the feed-forward network. You should be able to get the validation accuracy above 70% if you build everything right.
Note for Workspace users: One important tip if you're using the workspace to run your code: to avoid having your workspace disconnect during the long-running tasks in this notebook, please read the earlier page in this lesson called Intro to GPU Workspaces about keeping your session active. You'll want to include code from the workspace_utils.py module. Also, if your model is over 1 GB when saved as a checkpoint, there might be issues with saving backups in your workspace. If your saved checkpoint is larger than 1 GB (you can open a terminal and check with ls -lh), you should reduce the size of your hidden layers and train again.
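As a starting point, here is a minimal sketch of the transfer-learning setup described above: a frozen MobileNet feature extractor from TensorFlow Hub with a new feed-forward head on top. The TF Hub URL, the 1280-dim feature size, and the head layout are assumptions for illustration, not the project's required values.

```python
import tensorflow as tf

# Assumed TF Hub module: MobileNetV2 feature vectors (1280-dim) for 224x224 inputs.
MOBILENET_URL = 'https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4'

def build_head(num_classes):
    # The new feed-forward classifier; these are the only weights that train.
    return tf.keras.Sequential([
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(num_classes, activation='softmax'),
    ])

def build_transfer_model(num_classes, module_url=MOBILENET_URL):
    # Imported here so the head alone can be exercised without tensorflow_hub.
    import tensorflow_hub as hub
    feature_extractor = hub.KerasLayer(module_url,
                                       input_shape=(224, 224, 3),
                                       trainable=False)  # freeze MobileNet weights
    model = tf.keras.Sequential([feature_extractor, build_head(num_classes)])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model
```

With the extractor frozen, `model.fit(training_batches, ...)` updates only the head's weights, which is what the "update only the feed-forward network" requirement asks for.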
# Build the ANN model.
tf.keras.backend.clear_session()
ann_classifier = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape = (224, 224, 3)),
    tf.keras.layers.Dense(1024, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(512, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(256, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(128, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(102, activation = 'softmax')
])
ann_classifier.compile(optimizer = 'adam',
                       loss = 'sparse_categorical_crossentropy',
                       metrics = ['accuracy'])
ann_classifier.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
flatten (Flatten)            (None, 150528)            0
dense (Dense)                (None, 1024)              154141696
dropout (Dropout)            (None, 1024)              0
dense_1 (Dense)              (None, 512)               524800
dropout_1 (Dropout)          (None, 512)               0
dense_2 (Dense)              (None, 256)               131328
dropout_2 (Dropout)          (None, 256)               0
dense_3 (Dense)              (None, 128)               32896
dropout_3 (Dropout)          (None, 128)               0
dense_4 (Dense)              (None, 102)               13158
=================================================================
Total params: 154,843,878
Trainable params: 154,843,878
Non-trainable params: 0
_________________________________________________________________
# Evaluate the accuracy and loss of untrained model
for image_batch, label_batch in training_batches.take(1):
    loss, accuracy = ann_classifier.evaluate(image_batch, label_batch)
print('Loss of the untrained model: ', loss * 100)
print('Accuracy of the untrained model: ', accuracy * 100)
2/2 [==============================] - 1s 180ms/step - loss: 4.5725 - accuracy: 0.0196
Loss of the untrained model:  457.24620819091797
Accuracy of the untrained model:  1.9607843831181526
# Train the model
num_epochs = 30
early_stopping = tf.keras.callbacks.EarlyStopping(monitor = 'val_loss', patience = 5)
history = ann_classifier.fit(training_batches,
                             epochs = num_epochs,
                             validation_data = validation_batches,
                             callbacks = [early_stopping])
Epoch 1/30 121/121 [==============================] - 141s 1s/step - loss: 8.9434 - accuracy: 0.0159 - val_loss: 4.6238 - val_accuracy: 0.0069
Epoch 2/30 121/121 [==============================] - 120s 992ms/step - loss: 4.5659 - accuracy: 0.0288 - val_loss: 4.6192 - val_accuracy: 0.0088
Epoch 3/30 121/121 [==============================] - 121s 997ms/step - loss: 4.5198 - accuracy: 0.0356 - val_loss: 4.6463 - val_accuracy: 0.0098
Epoch 4/30 121/121 [==============================] - 121s 999ms/step - loss: 4.4760 - accuracy: 0.0407 - val_loss: 4.6815 - val_accuracy: 0.0118
Epoch 5/30 121/121 [==============================] - 119s 988ms/step - loss: 4.4449 - accuracy: 0.0392 - val_loss: 4.7596 - val_accuracy: 0.0098
Epoch 6/30 121/121 [==============================] - 120s 992ms/step - loss: 4.4361 - accuracy: 0.0394 - val_loss: 4.7697 - val_accuracy: 0.0078
Epoch 7/30 121/121 [==============================] - 120s 996ms/step - loss: 4.4326 - accuracy: 0.0400 - val_loss: 4.7735 - val_accuracy: 0.0098
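One optional refinement (a suggestion, not part of the run above): passing `restore_best_weights=True` makes `EarlyStopping` roll the model back to the weights from the best epoch once training stops, rather than keeping the final, slightly worse, ones.

```python
import tensorflow as tf

# Stops after `patience` epochs without val_loss improvement, then restores
# the weights from the epoch with the lowest val_loss.
early_stopping = tf.keras.callbacks.EarlyStopping(monitor = 'val_loss',
                                                  patience = 5,
                                                  restore_best_weights = True)
```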
# Extract the training/validation accuracy and loss for plotting.
train_acc = history.history['accuracy']
train_loss = history.history['loss']
val_accuracy = history.history['val_accuracy']
val_loss = history.history['val_loss']
# Plot the validation loss with train loss and similar with accuracy for this ANN model
epochs_range = range(len(train_acc))
plt.figure(figsize = [10, 5])
plt.subplot(1, 2, 1)
plt.plot(epochs_range, train_acc, color = 'red', label = 'Training Accuracy')
plt.plot(epochs_range, val_accuracy, color = 'blue', label = 'Validation Accuracy')
plt.legend(loc = 'center right')
plt.title('Training Accuracy vs Validation Accuracy for ANN')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, train_loss, color = 'orange', label = 'Training Loss')
plt.plot(epochs_range, val_loss, color = 'green', label = 'Validation Loss')
plt.legend(loc = 'upper right')
plt.title('Training Loss vs Validation Loss for ANN')
plt.show()
# Build the CNN model
tf.keras.backend.clear_session()
cnn_classifier = tf.keras.models.Sequential([
    # Image preprocessing / augmentation layers
    tf.keras.layers.RandomFlip('vertical', input_shape = (224, 224, 3)),
    tf.keras.layers.RandomFlip('horizontal'),
    tf.keras.layers.RandomRotation(factor = 0.2, fill_mode = 'nearest'),
    tf.keras.layers.RandomZoom(0.3),
    tf.keras.layers.RandomContrast(0.3),
    # Convolutional layers
    tf.keras.layers.Conv2D(filters = 32, kernel_size = (3, 3), activation = 'relu'),
    tf.keras.layers.MaxPool2D(),
    tf.keras.layers.Conv2D(filters = 64, kernel_size = (3, 3), activation = 'relu'),
    tf.keras.layers.MaxPool2D(),
    tf.keras.layers.Conv2D(filters = 128, kernel_size = (3, 3), activation = 'relu'),
    tf.keras.layers.MaxPool2D(),
    tf.keras.layers.Conv2D(filters = 256, kernel_size = (3, 3), activation = 'relu'),
    tf.keras.layers.MaxPool2D(),
    # Dense layers
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1024, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(512, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(256, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(128, activation = 'relu'),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(102, activation = 'softmax')
])
cnn_classifier.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                       Output Shape              Param #
=================================================================
random_flip (RandomFlip)           (None, 224, 224, 3)       0
random_flip_1 (RandomFlip)         (None, 224, 224, 3)       0
random_rotation (RandomRotation)   (None, 224, 224, 3)       0
random_zoom (RandomZoom)           (None, 224, 224, 3)       0
random_contrast (RandomContrast)   (None, 224, 224, 3)       0
conv2d (Conv2D)                    (None, 222, 222, 32)      896
max_pooling2d (MaxPooling2D)       (None, 111, 111, 32)      0
conv2d_1 (Conv2D)                  (None, 109, 109, 64)      18496
max_pooling2d_1 (MaxPooling2D)     (None, 54, 54, 64)        0
conv2d_2 (Conv2D)                  (None, 52, 52, 128)       73856
max_pooling2d_2 (MaxPooling2D)     (None, 26, 26, 128)       0
conv2d_3 (Conv2D)                  (None, 24, 24, 256)       295168
max_pooling2d_3 (MaxPooling2D)     (None, 12, 12, 256)       0
flatten (Flatten)                  (None, 36864)             0
dense (Dense)                      (None, 1024)              37749760
dropout (Dropout)                  (None, 1024)              0
dense_1 (Dense)                    (None, 512)               524800
dropout_1 (Dropout)                (None, 512)               0
dense_2 (Dense)                    (None, 256)               131328
dropout_2 (Dropout)                (None, 256)               0
dense_3 (Dense)                    (None, 128)               32896
dropout_3 (Dropout)                (None, 128)               0
dense_4 (Dense)                    (None, 102)               13158
=================================================================
Total params: 38,840,358
Trainable params: 38,840,358
Non-trainable params: 0
_________________________________________________________________
# Compile the CNN model
cnn_classifier.compile(optimizer = 'adam',
                       loss = 'sparse_categorical_crossentropy',
                       metrics = ['accuracy'])
# Evaluate the accuracy and loss of untrained model
for image_batch, label_batch in training_batches.take(1):
    loss, accuracy = cnn_classifier.evaluate(image_batch, label_batch)
print('Loss of the untrained model: ', loss * 100)
print('Accuracy of the untrained model: ', accuracy * 100)
2/2 [==============================] - 30s 513ms/step - loss: 4.6239 - accuracy: 0.0000e+00
Loss of the untrained model:  462.39452362060547
Accuracy of the untrained model:  0.0
# Train the CNN classifier
num_epochs = 500
# Model.fit_generator is deprecated in TF 2; Model.fit accepts tf.data datasets directly.
history = cnn_classifier.fit(training_batches,
                             epochs = num_epochs,
                             validation_data = validation_batches)
Epoch 1/500 121/121 [==============================] - 44s 280ms/step - loss: 4.5200 - accuracy: 0.0342 - val_loss: 4.7504 - val_accuracy: 0.0098
Epoch 2/500 121/121 [==============================] - 24s 197ms/step - loss: 4.3717 - accuracy: 0.0504 - val_loss: 4.5646 - val_accuracy: 0.0235
Epoch 3/500 121/121 [==============================] - 24s 196ms/step - loss: 4.1376 - accuracy: 0.0603 - val_loss: 4.3524 - val_accuracy: 0.0206
Epoch 4/500 121/121 [==============================] - 24s 196ms/step - loss: 3.9827 - accuracy: 0.0722 - val_loss: 4.3236 - val_accuracy: 0.0353
Epoch 5/500 121/121 [==============================] - 24s 196ms/step - loss: 3.8633 - accuracy: 0.0846 - val_loss: 4.0489 - val_accuracy: 0.0373
...
Epoch 106/500 121/121 [==============================] - 24s 195ms/step - loss: 1.2725 - accuracy: 0.6477 - val_loss: 1.7646 - val_accuracy: 0.5922
Epoch 107/500 121/121 [==============================] - 24s 194ms/step - loss: 1.2838 - accuracy: 0.6409 - val_loss: 1.8111 - val_accuracy: 0.5716
Epoch 108/500 121/121 [==============================] - 23s 194ms/step - loss: 1.2362 - accuracy: 0.6556 - val_loss: 1.7712 - val_accuracy: 0.5863
Epoch 109/500 121/121
[==============================] - 24s 194ms/step - loss: 1.2759 - accuracy: 0.6425 - val_loss: 1.7130 - val_accuracy: 0.5892 Epoch 110/500 121/121 [==============================] - 23s 194ms/step - loss: 1.2672 - accuracy: 0.6562 - val_loss: 1.7055 - val_accuracy: 0.5892 Epoch 111/500 121/121 [==============================] - 24s 194ms/step - loss: 1.2373 - accuracy: 0.6630 - val_loss: 1.7821 - val_accuracy: 0.5765 Epoch 112/500 121/121 [==============================] - 24s 194ms/step - loss: 1.2367 - accuracy: 0.6554 - val_loss: 1.7279 - val_accuracy: 0.5794 Epoch 113/500 121/121 [==============================] - 24s 194ms/step - loss: 1.2703 - accuracy: 0.6508 - val_loss: 1.7949 - val_accuracy: 0.5775 Epoch 114/500 121/121 [==============================] - 24s 195ms/step - loss: 1.2214 - accuracy: 0.6601 - val_loss: 1.6907 - val_accuracy: 0.5971 Epoch 115/500 121/121 [==============================] - 23s 194ms/step - loss: 1.2301 - accuracy: 0.6656 - val_loss: 1.6154 - val_accuracy: 0.5990 Epoch 116/500 121/121 [==============================] - 24s 194ms/step - loss: 1.2318 - accuracy: 0.6582 - val_loss: 1.6946 - val_accuracy: 0.5912 Epoch 117/500 121/121 [==============================] - 24s 195ms/step - loss: 1.2106 - accuracy: 0.6700 - val_loss: 1.7177 - val_accuracy: 0.5922 Epoch 118/500 121/121 [==============================] - 23s 194ms/step - loss: 1.2181 - accuracy: 0.6681 - val_loss: 2.1693 - val_accuracy: 0.5471 Epoch 119/500 121/121 [==============================] - 23s 194ms/step - loss: 1.2391 - accuracy: 0.6681 - val_loss: 1.5833 - val_accuracy: 0.6176 Epoch 120/500 121/121 [==============================] - 24s 195ms/step - loss: 1.2396 - accuracy: 0.6616 - val_loss: 1.6389 - val_accuracy: 0.6137 Epoch 121/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1714 - accuracy: 0.6695 - val_loss: 1.6263 - val_accuracy: 0.6078 Epoch 122/500 121/121 [==============================] - 24s 195ms/step - loss: 1.2014 - 
accuracy: 0.6731 - val_loss: 1.7080 - val_accuracy: 0.5833 Epoch 123/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1866 - accuracy: 0.6691 - val_loss: 1.5167 - val_accuracy: 0.6216 Epoch 124/500 121/121 [==============================] - 24s 195ms/step - loss: 1.2170 - accuracy: 0.6609 - val_loss: 2.0573 - val_accuracy: 0.5676 Epoch 125/500 121/121 [==============================] - 24s 194ms/step - loss: 1.1714 - accuracy: 0.6788 - val_loss: 1.6037 - val_accuracy: 0.6147 Epoch 126/500 121/121 [==============================] - 24s 194ms/step - loss: 1.1889 - accuracy: 0.6728 - val_loss: 1.8991 - val_accuracy: 0.5794 Epoch 127/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1806 - accuracy: 0.6699 - val_loss: 1.7896 - val_accuracy: 0.5922 Epoch 128/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1715 - accuracy: 0.6897 - val_loss: 2.0524 - val_accuracy: 0.5480 Epoch 129/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1598 - accuracy: 0.6777 - val_loss: 1.8146 - val_accuracy: 0.5853 Epoch 130/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1889 - accuracy: 0.6793 - val_loss: 1.6856 - val_accuracy: 0.6010 Epoch 131/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1207 - accuracy: 0.6910 - val_loss: 1.8414 - val_accuracy: 0.5853 Epoch 132/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1979 - accuracy: 0.6747 - val_loss: 1.7264 - val_accuracy: 0.6059 Epoch 133/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1298 - accuracy: 0.6879 - val_loss: 1.6639 - val_accuracy: 0.6216 Epoch 134/500 121/121 [==============================] - 24s 194ms/step - loss: 1.1570 - accuracy: 0.6936 - val_loss: 1.8106 - val_accuracy: 0.5980 Epoch 135/500 121/121 [==============================] - 24s 194ms/step - loss: 1.1441 - accuracy: 0.6812 - val_loss: 1.7855 - val_accuracy: 0.5892 Epoch 136/500 
121/121 [==============================] - 24s 194ms/step - loss: 1.1413 - accuracy: 0.6873 - val_loss: 1.6467 - val_accuracy: 0.6098 Epoch 137/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1106 - accuracy: 0.6967 - val_loss: 1.6845 - val_accuracy: 0.6127 Epoch 138/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1360 - accuracy: 0.6873 - val_loss: 1.5602 - val_accuracy: 0.6206 Epoch 139/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1624 - accuracy: 0.6847 - val_loss: 1.5921 - val_accuracy: 0.6127 Epoch 140/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1454 - accuracy: 0.6863 - val_loss: 1.7002 - val_accuracy: 0.6010 Epoch 141/500 121/121 [==============================] - 24s 194ms/step - loss: 1.1530 - accuracy: 0.6821 - val_loss: 1.6882 - val_accuracy: 0.6167 Epoch 142/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0386 - accuracy: 0.7104 - val_loss: 1.6001 - val_accuracy: 0.6265 Epoch 143/500 121/121 [==============================] - 24s 194ms/step - loss: 1.1439 - accuracy: 0.6852 - val_loss: 1.7603 - val_accuracy: 0.5990 Epoch 144/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0890 - accuracy: 0.6995 - val_loss: 1.8549 - val_accuracy: 0.6137 Epoch 145/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0782 - accuracy: 0.7043 - val_loss: 1.7661 - val_accuracy: 0.5971 Epoch 146/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1266 - accuracy: 0.6915 - val_loss: 1.7684 - val_accuracy: 0.5961 Epoch 147/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0841 - accuracy: 0.6990 - val_loss: 1.6466 - val_accuracy: 0.6265 Epoch 148/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0909 - accuracy: 0.7014 - val_loss: 1.7592 - val_accuracy: 0.6196 Epoch 149/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1060 - 
accuracy: 0.6972 - val_loss: 1.6706 - val_accuracy: 0.6127 Epoch 150/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1081 - accuracy: 0.7000 - val_loss: 1.5970 - val_accuracy: 0.6225 Epoch 151/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0908 - accuracy: 0.7048 - val_loss: 1.6616 - val_accuracy: 0.6127 Epoch 152/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0991 - accuracy: 0.6956 - val_loss: 1.5565 - val_accuracy: 0.6235 Epoch 153/500 121/121 [==============================] - 24s 195ms/step - loss: 1.1071 - accuracy: 0.7006 - val_loss: 1.7345 - val_accuracy: 0.6314 Epoch 154/500 121/121 [==============================] - 24s 194ms/step - loss: 1.0230 - accuracy: 0.7170 - val_loss: 1.7225 - val_accuracy: 0.6176 Epoch 155/500 121/121 [==============================] - 24s 194ms/step - loss: 1.0648 - accuracy: 0.7082 - val_loss: 1.7125 - val_accuracy: 0.6127 Epoch 156/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0475 - accuracy: 0.7115 - val_loss: 1.7504 - val_accuracy: 0.6167 Epoch 157/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0381 - accuracy: 0.7136 - val_loss: 1.7431 - val_accuracy: 0.6078 Epoch 158/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0408 - accuracy: 0.7073 - val_loss: 1.7051 - val_accuracy: 0.6284 Epoch 159/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0819 - accuracy: 0.7001 - val_loss: 1.7534 - val_accuracy: 0.6216 Epoch 160/500 121/121 [==============================] - 24s 194ms/step - loss: 1.0388 - accuracy: 0.7139 - val_loss: 1.7237 - val_accuracy: 0.6186 Epoch 161/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0394 - accuracy: 0.7164 - val_loss: 1.7553 - val_accuracy: 0.6088 Epoch 162/500 121/121 [==============================] - 23s 194ms/step - loss: 1.1344 - accuracy: 0.6928 - val_loss: 1.8151 - val_accuracy: 0.6108 Epoch 163/500 
121/121 [==============================] - 24s 195ms/step - loss: 1.0667 - accuracy: 0.7121 - val_loss: 1.7379 - val_accuracy: 0.6225 Epoch 164/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0706 - accuracy: 0.7066 - val_loss: 1.7091 - val_accuracy: 0.6020 Epoch 165/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0493 - accuracy: 0.7102 - val_loss: 1.8629 - val_accuracy: 0.5931 Epoch 166/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0876 - accuracy: 0.7089 - val_loss: 1.6082 - val_accuracy: 0.6245 Epoch 167/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0213 - accuracy: 0.7222 - val_loss: 1.7760 - val_accuracy: 0.6088 Epoch 168/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0197 - accuracy: 0.7190 - val_loss: 2.0465 - val_accuracy: 0.5863 Epoch 169/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0216 - accuracy: 0.7235 - val_loss: 1.7748 - val_accuracy: 0.6314 Epoch 170/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0146 - accuracy: 0.7219 - val_loss: 1.7716 - val_accuracy: 0.6265 Epoch 171/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0670 - accuracy: 0.7159 - val_loss: 1.6496 - val_accuracy: 0.6118 Epoch 172/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0038 - accuracy: 0.7226 - val_loss: 1.8200 - val_accuracy: 0.6196 Epoch 173/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0327 - accuracy: 0.7213 - val_loss: 1.7131 - val_accuracy: 0.6324 Epoch 174/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9885 - accuracy: 0.7250 - val_loss: 1.5689 - val_accuracy: 0.6480 Epoch 175/500 121/121 [==============================] - 23s 194ms/step - loss: 1.0130 - accuracy: 0.7237 - val_loss: 1.6338 - val_accuracy: 0.6304 Epoch 176/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0535 - 
accuracy: 0.7222 - val_loss: 1.6485 - val_accuracy: 0.6441 Epoch 177/500 121/121 [==============================] - 24s 195ms/step - loss: 0.9981 - accuracy: 0.7274 - val_loss: 1.6732 - val_accuracy: 0.6304 Epoch 178/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0362 - accuracy: 0.7268 - val_loss: 1.7001 - val_accuracy: 0.6235 Epoch 179/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9854 - accuracy: 0.7269 - val_loss: 1.6841 - val_accuracy: 0.6255 Epoch 180/500 121/121 [==============================] - 24s 195ms/step - loss: 0.9713 - accuracy: 0.7287 - val_loss: 1.5709 - val_accuracy: 0.6402 Epoch 181/500 121/121 [==============================] - 23s 194ms/step - loss: 0.9516 - accuracy: 0.7349 - val_loss: 1.7572 - val_accuracy: 0.6137 Epoch 182/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9874 - accuracy: 0.7312 - val_loss: 1.6620 - val_accuracy: 0.6431 Epoch 183/500 121/121 [==============================] - 24s 196ms/step - loss: 1.0152 - accuracy: 0.7232 - val_loss: 1.6906 - val_accuracy: 0.6471 Epoch 184/500 121/121 [==============================] - 24s 196ms/step - loss: 1.0116 - accuracy: 0.7208 - val_loss: 1.6754 - val_accuracy: 0.6118 Epoch 185/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9557 - accuracy: 0.7400 - val_loss: 1.5797 - val_accuracy: 0.6559 Epoch 186/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9803 - accuracy: 0.7351 - val_loss: 1.5568 - val_accuracy: 0.6422 Epoch 187/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9756 - accuracy: 0.7322 - val_loss: 1.4897 - val_accuracy: 0.6618 Epoch 188/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9857 - accuracy: 0.7310 - val_loss: 1.6873 - val_accuracy: 0.6392 Epoch 189/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9862 - accuracy: 0.7361 - val_loss: 1.5504 - val_accuracy: 0.6520 Epoch 190/500 
121/121 [==============================] - 24s 196ms/step - loss: 0.9841 - accuracy: 0.7312 - val_loss: 1.7175 - val_accuracy: 0.6284 Epoch 191/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9866 - accuracy: 0.7343 - val_loss: 1.6402 - val_accuracy: 0.6539 Epoch 192/500 121/121 [==============================] - 24s 196ms/step - loss: 1.0150 - accuracy: 0.7175 - val_loss: 1.7039 - val_accuracy: 0.6304 Epoch 193/500 121/121 [==============================] - 24s 195ms/step - loss: 1.0250 - accuracy: 0.7302 - val_loss: 1.7938 - val_accuracy: 0.6049 Epoch 194/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9244 - accuracy: 0.7484 - val_loss: 1.6098 - val_accuracy: 0.6480 Epoch 195/500 121/121 [==============================] - 24s 195ms/step - loss: 0.9202 - accuracy: 0.7499 - val_loss: 1.6893 - val_accuracy: 0.6431 Epoch 196/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9259 - accuracy: 0.7437 - val_loss: 1.8326 - val_accuracy: 0.6275 Epoch 197/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9713 - accuracy: 0.7313 - val_loss: 1.7304 - val_accuracy: 0.6196 Epoch 198/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9512 - accuracy: 0.7456 - val_loss: 1.6532 - val_accuracy: 0.6402 Epoch 199/500 121/121 [==============================] - 24s 196ms/step - loss: 1.0068 - accuracy: 0.7341 - val_loss: 1.5562 - val_accuracy: 0.6382 Epoch 200/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9141 - accuracy: 0.7513 - val_loss: 1.7169 - val_accuracy: 0.6431 Epoch 201/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9694 - accuracy: 0.7361 - val_loss: 1.7737 - val_accuracy: 0.6363 Epoch 202/500 121/121 [==============================] - 24s 197ms/step - loss: 1.0024 - accuracy: 0.7401 - val_loss: 1.5961 - val_accuracy: 0.6402 Epoch 203/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9025 - 
accuracy: 0.7557 - val_loss: 1.5845 - val_accuracy: 0.6618 Epoch 204/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9331 - accuracy: 0.7499 - val_loss: 1.6791 - val_accuracy: 0.6353 Epoch 205/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9455 - accuracy: 0.7390 - val_loss: 1.7798 - val_accuracy: 0.6324 Epoch 206/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9412 - accuracy: 0.7473 - val_loss: 1.4952 - val_accuracy: 0.6549 Epoch 207/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9813 - accuracy: 0.7346 - val_loss: 1.5832 - val_accuracy: 0.6745 Epoch 208/500 121/121 [==============================] - 24s 198ms/step - loss: 0.9686 - accuracy: 0.7362 - val_loss: 1.6349 - val_accuracy: 0.6402 Epoch 209/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9325 - accuracy: 0.7458 - val_loss: 1.5378 - val_accuracy: 0.6637 Epoch 210/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8617 - accuracy: 0.7585 - val_loss: 1.6669 - val_accuracy: 0.6618 Epoch 211/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9297 - accuracy: 0.7494 - val_loss: 1.7221 - val_accuracy: 0.6324 Epoch 212/500 121/121 [==============================] - 24s 199ms/step - loss: 0.9310 - accuracy: 0.7491 - val_loss: 1.7175 - val_accuracy: 0.6265 Epoch 213/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9283 - accuracy: 0.7505 - val_loss: 1.7731 - val_accuracy: 0.6373 Epoch 214/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9418 - accuracy: 0.7453 - val_loss: 1.7857 - val_accuracy: 0.6353 Epoch 215/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9830 - accuracy: 0.7439 - val_loss: 1.5926 - val_accuracy: 0.6461 Epoch 216/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9749 - accuracy: 0.7307 - val_loss: 1.6825 - val_accuracy: 0.6382 Epoch 217/500 
121/121 [==============================] - 24s 197ms/step - loss: 0.8912 - accuracy: 0.7548 - val_loss: 1.6081 - val_accuracy: 0.6392 Epoch 218/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9300 - accuracy: 0.7522 - val_loss: 1.5471 - val_accuracy: 0.6667 Epoch 219/500 121/121 [==============================] - 24s 198ms/step - loss: 0.9246 - accuracy: 0.7531 - val_loss: 1.7098 - val_accuracy: 0.6353 Epoch 220/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8653 - accuracy: 0.7660 - val_loss: 1.6428 - val_accuracy: 0.6637 Epoch 221/500 121/121 [==============================] - 24s 198ms/step - loss: 0.9444 - accuracy: 0.7484 - val_loss: 1.7000 - val_accuracy: 0.6333 Epoch 222/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9155 - accuracy: 0.7544 - val_loss: 1.6627 - val_accuracy: 0.6441 Epoch 223/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9044 - accuracy: 0.7554 - val_loss: 1.6800 - val_accuracy: 0.6451 Epoch 224/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9250 - accuracy: 0.7538 - val_loss: 1.5743 - val_accuracy: 0.6490 Epoch 225/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8799 - accuracy: 0.7601 - val_loss: 1.7399 - val_accuracy: 0.6598 Epoch 226/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9120 - accuracy: 0.7565 - val_loss: 1.6964 - val_accuracy: 0.6490 Epoch 227/500 121/121 [==============================] - 24s 196ms/step - loss: 0.8947 - accuracy: 0.7564 - val_loss: 1.6911 - val_accuracy: 0.6559 Epoch 228/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9259 - accuracy: 0.7471 - val_loss: 1.8424 - val_accuracy: 0.6324 Epoch 229/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9075 - accuracy: 0.7552 - val_loss: 1.7809 - val_accuracy: 0.6314 Epoch 230/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8650 - 
accuracy: 0.7644 - val_loss: 1.6998 - val_accuracy: 0.6549 Epoch 231/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8977 - accuracy: 0.7559 - val_loss: 1.7438 - val_accuracy: 0.6480 Epoch 232/500 121/121 [==============================] - 24s 196ms/step - loss: 0.8793 - accuracy: 0.7658 - val_loss: 1.6050 - val_accuracy: 0.6775 Epoch 233/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9325 - accuracy: 0.7513 - val_loss: 1.5659 - val_accuracy: 0.6784 Epoch 234/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8704 - accuracy: 0.7583 - val_loss: 1.6618 - val_accuracy: 0.6480 Epoch 235/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8823 - accuracy: 0.7590 - val_loss: 1.5969 - val_accuracy: 0.6725 Epoch 236/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8581 - accuracy: 0.7712 - val_loss: 1.6525 - val_accuracy: 0.6529 Epoch 237/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8576 - accuracy: 0.7635 - val_loss: 1.5068 - val_accuracy: 0.6627 Epoch 238/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9390 - accuracy: 0.7489 - val_loss: 1.6161 - val_accuracy: 0.6529 Epoch 239/500 121/121 [==============================] - 24s 198ms/step - loss: 0.9346 - accuracy: 0.7549 - val_loss: 1.6791 - val_accuracy: 0.6578 Epoch 240/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9074 - accuracy: 0.7606 - val_loss: 1.5061 - val_accuracy: 0.6775 Epoch 241/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8605 - accuracy: 0.7673 - val_loss: 1.5122 - val_accuracy: 0.6912 Epoch 242/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9122 - accuracy: 0.7583 - val_loss: 1.6076 - val_accuracy: 0.6618 Epoch 243/500 121/121 [==============================] - 24s 198ms/step - loss: 0.9178 - accuracy: 0.7575 - val_loss: 1.8408 - val_accuracy: 0.6373 Epoch 244/500 
121/121 [==============================] - 24s 197ms/step - loss: 0.8738 - accuracy: 0.7668 - val_loss: 1.5907 - val_accuracy: 0.6735 Epoch 245/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9015 - accuracy: 0.7585 - val_loss: 1.6200 - val_accuracy: 0.6412 Epoch 246/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8813 - accuracy: 0.7552 - val_loss: 1.6074 - val_accuracy: 0.6627 Epoch 247/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8711 - accuracy: 0.7660 - val_loss: 1.6998 - val_accuracy: 0.6598 Epoch 248/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8578 - accuracy: 0.7702 - val_loss: 1.6916 - val_accuracy: 0.6598 Epoch 249/500 121/121 [==============================] - 24s 197ms/step - loss: 0.9508 - accuracy: 0.7497 - val_loss: 1.6429 - val_accuracy: 0.6725 Epoch 250/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8648 - accuracy: 0.7647 - val_loss: 1.7262 - val_accuracy: 0.6490 Epoch 251/500 121/121 [==============================] - 24s 196ms/step - loss: 0.9092 - accuracy: 0.7583 - val_loss: 1.4915 - val_accuracy: 0.6706 Epoch 252/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8653 - accuracy: 0.7608 - val_loss: 1.6655 - val_accuracy: 0.6745 Epoch 253/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8567 - accuracy: 0.7700 - val_loss: 1.6547 - val_accuracy: 0.6814 Epoch 254/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8977 - accuracy: 0.7590 - val_loss: 1.6047 - val_accuracy: 0.6716 Epoch 255/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8972 - accuracy: 0.7650 - val_loss: 1.7641 - val_accuracy: 0.6618 Epoch 256/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8808 - accuracy: 0.7681 - val_loss: 1.7375 - val_accuracy: 0.6549 Epoch 257/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8890 - 
accuracy: 0.7626 - val_loss: 1.6833 - val_accuracy: 0.6608 Epoch 258/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8731 - accuracy: 0.7686 - val_loss: 1.5840 - val_accuracy: 0.6716 Epoch 259/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8633 - accuracy: 0.7647 - val_loss: 1.5869 - val_accuracy: 0.6647 Epoch 260/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8679 - accuracy: 0.7683 - val_loss: 1.7437 - val_accuracy: 0.6461 Epoch 261/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8342 - accuracy: 0.7718 - val_loss: 1.6035 - val_accuracy: 0.6657 Epoch 262/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8034 - accuracy: 0.7814 - val_loss: 1.6319 - val_accuracy: 0.6696 Epoch 263/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8828 - accuracy: 0.7686 - val_loss: 1.6005 - val_accuracy: 0.6676 Epoch 264/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8998 - accuracy: 0.7556 - val_loss: 1.5952 - val_accuracy: 0.6569 Epoch 265/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8365 - accuracy: 0.7756 - val_loss: 1.6507 - val_accuracy: 0.6765 Epoch 266/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8437 - accuracy: 0.7725 - val_loss: 1.6827 - val_accuracy: 0.6431 Epoch 267/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8331 - accuracy: 0.7752 - val_loss: 1.6187 - val_accuracy: 0.6539 Epoch 268/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8535 - accuracy: 0.7722 - val_loss: 1.5244 - val_accuracy: 0.6637 Epoch 269/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8041 - accuracy: 0.7806 - val_loss: 1.8501 - val_accuracy: 0.6431 Epoch 270/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8576 - accuracy: 0.7761 - val_loss: 1.6479 - val_accuracy: 0.6627 Epoch 271/500 
121/121 [==============================] - 24s 198ms/step - loss: 0.8606 - accuracy: 0.7728 - val_loss: 1.9710 - val_accuracy: 0.6118 Epoch 272/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8714 - accuracy: 0.7650 - val_loss: 1.6772 - val_accuracy: 0.6667 Epoch 273/500 121/121 [==============================] - 24s 197ms/step - loss: 0.7995 - accuracy: 0.7848 - val_loss: 1.6140 - val_accuracy: 0.6559 Epoch 274/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8093 - accuracy: 0.7806 - val_loss: 1.6738 - val_accuracy: 0.6529 Epoch 275/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8247 - accuracy: 0.7819 - val_loss: 1.6819 - val_accuracy: 0.6755 Epoch 276/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8601 - accuracy: 0.7744 - val_loss: 1.5552 - val_accuracy: 0.6735 Epoch 277/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8707 - accuracy: 0.7759 - val_loss: 1.7588 - val_accuracy: 0.6402 Epoch 278/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8861 - accuracy: 0.7629 - val_loss: 1.5976 - val_accuracy: 0.6667 Epoch 279/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8609 - accuracy: 0.7722 - val_loss: 1.5220 - val_accuracy: 0.6745 Epoch 280/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8088 - accuracy: 0.7819 - val_loss: 1.6852 - val_accuracy: 0.6539 Epoch 281/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8201 - accuracy: 0.7878 - val_loss: 1.5999 - val_accuracy: 0.6725 Epoch 282/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8404 - accuracy: 0.7782 - val_loss: 1.7232 - val_accuracy: 0.6333 Epoch 283/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8619 - accuracy: 0.7756 - val_loss: 1.5178 - val_accuracy: 0.6784 Epoch 284/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8323 - 
accuracy: 0.7827 - val_loss: 1.5308 - val_accuracy: 0.6804 Epoch 285/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8153 - accuracy: 0.7801 - val_loss: 1.6298 - val_accuracy: 0.6627 Epoch 286/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8052 - accuracy: 0.7852 - val_loss: 1.5588 - val_accuracy: 0.6775 Epoch 287/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8458 - accuracy: 0.7814 - val_loss: 1.6380 - val_accuracy: 0.6529 Epoch 288/500 121/121 [==============================] - 24s 198ms/step - loss: 0.9061 - accuracy: 0.7674 - val_loss: 1.5753 - val_accuracy: 0.6627 Epoch 289/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8510 - accuracy: 0.7770 - val_loss: 1.5656 - val_accuracy: 0.6500 Epoch 290/500 121/121 [==============================] - 24s 197ms/step - loss: 0.7816 - accuracy: 0.7842 - val_loss: 1.6710 - val_accuracy: 0.6804 Epoch 291/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8139 - accuracy: 0.7852 - val_loss: 1.7186 - val_accuracy: 0.6588 Epoch 292/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8186 - accuracy: 0.7837 - val_loss: 1.5125 - val_accuracy: 0.6794 Epoch 293/500 121/121 [==============================] - 24s 198ms/step - loss: 0.7753 - accuracy: 0.7910 - val_loss: 1.5528 - val_accuracy: 0.6833 Epoch 294/500 121/121 [==============================] - 24s 198ms/step - loss: 0.7910 - accuracy: 0.7907 - val_loss: 1.5652 - val_accuracy: 0.6637 Epoch 295/500 121/121 [==============================] - 24s 197ms/step - loss: 0.8340 - accuracy: 0.7775 - val_loss: 1.9399 - val_accuracy: 0.6422 Epoch 296/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8067 - accuracy: 0.7873 - val_loss: 1.7516 - val_accuracy: 0.6529 Epoch 297/500 121/121 [==============================] - 24s 198ms/step - loss: 0.8238 - accuracy: 0.7870 - val_loss: 1.4968 - val_accuracy: 0.6873 Epoch 298/500 
121/121 [==============================] - 24s 197ms/step - loss: 0.8604 - accuracy: 0.7814 - val_loss: 1.5268 - val_accuracy: 0.6735
[... per-epoch output for epochs 299-499 trimmed: training accuracy climbed from roughly 0.78 to 0.83 while validation accuracy plateaued between roughly 0.63 and 0.73 ...]
Epoch 500/500
121/121 [==============================] - 24s 198ms/step - loss: 0.6636 - accuracy: 0.8340 - val_loss: 1.8742 - val_accuracy: 0.6725
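The log above shows classic overfitting: training accuracy keeps climbing toward 0.83 while validation accuracy stalls around 0.67 and validation loss drifts upward. Rather than running all 500 epochs, you could stop once validation loss stops improving. The logic behind Keras's `EarlyStopping` callback can be sketched in plain Python (`best_stop_epoch` is a hypothetical helper for illustration; in practice you would pass `tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)` to `fit` via `callbacks=[...]`):

```python
def best_stop_epoch(val_losses, patience=10):
    """Return (stop_epoch, best_loss): the epoch at which early stopping
    with the given patience would halt, and the lowest validation loss."""
    best_loss = float('inf')
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch, best_loss  # no improvement for `patience` epochs
    return len(val_losses) - 1, best_loss

# Improvement at epoch 1, then a plateau: stop once patience is exhausted.
print(best_stop_epoch([1.0, 0.9, 0.95, 0.96, 0.97, 0.98], patience=3))  # (4, 0.9)
```

Applied to this run, a patience of 10 on `val_loss` would likely have halted training hundreds of epochs earlier with essentially the same validation accuracy.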
# Extract the training/validation accuracy and loss to plot for the CNN model.
train_acc = history.history['accuracy']
train_loss = history.history['loss']
val_accuracy = history.history['val_accuracy']
val_loss = history.history['val_loss']

# Plot training vs. validation accuracy, and training vs. validation loss.
epochs_range = range(len(train_acc))
plt.figure(figsize=(13, 7))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, train_acc, color='red', label='Training Accuracy')
plt.plot(epochs_range, val_accuracy, color='blue', label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training vs. Validation Accuracy for CNN')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, train_loss, color='orange', label='Training Loss')
plt.plot(epochs_range, val_loss, color='green', label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training vs. Validation Loss for CNN')
plt.show()
# Evaluate the loss and accuracy of the trained CNN model on the testing batches.
# Note: loss is not a percentage, so only accuracy is scaled by 100 for display.
loss, accuracy = cnn_classifier.evaluate(testing_batches)
print('Loss of the trained CNN model: {:.4f}'.format(loss))
print('Accuracy of the trained CNN model: {:.2f}%'.format(accuracy * 100))
20/20 [==============================] - 1s 65ms/step - loss: 1.8724 - accuracy: 0.6990
Loss of the trained CNN model: 1.8724
Accuracy of the trained CNN model: 69.90%
# Download the MobileNet feature extractor and wrap it in a Keras layer
URL_ = 'https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4'
feature_extractor = hub.KerasLayer(URL_, input_shape=(224, 224, 3))
# Freeze the pre-trained weights so only the new classifier head is trained
feature_extractor.trainable = False
# TODO: Build and compile the network
tf.keras.backend.clear_session()
model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(102, activation='softmax')
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
keras_layer (KerasLayer)     (None, 1280)              2257984
_________________________________________________________________
dense (Dense)                (None, 102)               130662
=================================================================
Total params: 2,388,646
Trainable params: 130,662
Non-trainable params: 2,257,984
_________________________________________________________________
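As a sanity check on the summary above, the new head's parameter count is just its weight matrix plus its biases (the shapes below mirror the ones printed by `model.summary()`):

```python
# Check the parameter counts reported by model.summary().
feature_dim = 1280   # length of the MobileNet v2 feature vector
num_classes = 102    # Oxford Flowers 102 categories

dense_params = feature_dim * num_classes + num_classes  # weights + biases
total_params = 2_257_984 + dense_params                 # frozen extractor + head

print(dense_params)  # 130662 trainable params
print(total_params)  # 2388646 total params
```

Because the extractor is frozen, only those 130,662 head parameters are updated during training, which is why this network converges so much faster than the CNN trained from scratch.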
# Train the network
history = model.fit(training_batches,
                    epochs=10,
                    validation_data=validation_batches)
Epoch 1/10
121/121 [==============================] - 131s 1s/step - loss: 2.2590 - accuracy: 0.5253 - val_loss: 1.2470 - val_accuracy: 0.7333
Epoch 2/10
121/121 [==============================] - 127s 1s/step - loss: 0.6822 - accuracy: 0.8784 - val_loss: 0.7587 - val_accuracy: 0.8422
Epoch 3/10
121/121 [==============================] - 126s 1s/step - loss: 0.3996 - accuracy: 0.9345 - val_loss: 0.6131 - val_accuracy: 0.8627
Epoch 4/10
121/121 [==============================] - 126s 1s/step - loss: 0.2721 - accuracy: 0.9629 - val_loss: 0.5298 - val_accuracy: 0.8784
Epoch 5/10
121/121 [==============================] - 126s 1s/step - loss: 0.2006 - accuracy: 0.9753 - val_loss: 0.4791 - val_accuracy: 0.8833
Epoch 6/10
121/121 [==============================] - 127s 1s/step - loss: 0.1546 - accuracy: 0.9841 - val_loss: 0.4442 - val_accuracy: 0.8951
Epoch 7/10
121/121 [==============================] - 128s 1s/step - loss: 0.1217 - accuracy: 0.9912 - val_loss: 0.4299 - val_accuracy: 0.8951
Epoch 8/10
121/121 [==============================] - 126s 1s/step - loss: 0.0980 - accuracy: 0.9932 - val_loss: 0.4162 - val_accuracy: 0.8951
Epoch 9/10
121/121 [==============================] - 127s 1s/step - loss: 0.0807 - accuracy: 0.9963 - val_loss: 0.4152 - val_accuracy: 0.8951
Epoch 10/10
121/121 [==============================] - 129s 1s/step - loss: 0.0678 - accuracy: 0.9971 - val_loss: 0.3837 - val_accuracy: 0.9020
# TODO: Plot the loss and accuracy values achieved during training for the training
# and validation set.
training_accuracy = history.history['accuracy']
validation_accuracy = history.history['val_accuracy']
training_loss = history.history['loss']
validation_loss = history.history['val_loss']
epochs_range = range(len(training_accuracy))
plt.figure(figsize=(10, 6))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, training_accuracy, label='Training Accuracy')
plt.plot(epochs_range, validation_accuracy, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, training_loss, label='Training Loss')
plt.plot(epochs_range, validation_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()
It's good practice to test your trained network on test data: images the network has never seen in either training or validation. This gives you a good estimate of the model's performance on completely new images. You should be able to reach around 70% accuracy on the test set if the model has been trained well.
# TODO: Print the loss and accuracy values achieved on the entire test set.
loss, accuracy = model.evaluate(testing_batches)
print('Loss of the model on the testing set: {:.4f}'.format(loss))
print('Accuracy of the model on the testing set: {:.2%}'.format(accuracy))
20/20 [==============================] - 20s 1s/step - loss: 0.4420 - accuracy: 0.8755
Loss of the model on the testing set: 0.4420
Accuracy of the model on the testing set: 87.55%
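The accuracy reported by `evaluate` is top-1 accuracy: the fraction of images whose highest-probability class matches the ground-truth label. A minimal NumPy sketch of that computation, using made-up probabilities and labels:

```python
import numpy as np

# Hypothetical model outputs for 4 images over 3 classes (each row sums to 1)
probs = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
    [0.6, 0.3, 0.1],
])
labels = np.array([0, 1, 2, 1])  # made-up ground-truth class indices

# Top-1 accuracy: fraction of rows where the argmax matches the label
predictions = np.argmax(probs, axis=1)
accuracy = np.mean(predictions == labels)
print(accuracy)  # 3 of 4 correct -> 0.75
```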
Now that your network is trained, save the model so you can load it later for inference. In the cell below, save your model as a Keras model (i.e. save it as an HDF5 file).
# TODO: Save your trained model as a Keras model.
saved_keras_model_filepath = './Oxford_Flowers102_model_MobileNet.h5'
model.save(saved_keras_model_filepath)
Load the Keras model you saved above.
# TODO: Load the Keras model
reloaded_model = tf.keras.models.load_model(saved_keras_model_filepath,
custom_objects = {'KerasLayer': hub.KerasLayer})
reloaded_model.summary()
Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= keras_layer (KerasLayer) (None, 1280) 2257984 _________________________________________________________________ dense (Dense) (None, 102) 130662 ================================================================= Total params: 2,388,646 Trainable params: 130,662 Non-trainable params: 2,257,984 _________________________________________________________________
Now you'll write a function that uses your trained network for inference. Write a function called predict that takes an image path, a model, and an integer $K$, and returns the top $K$ most likely class labels along with their probabilities. The function call should look like:
probs, classes = predict(image_path, model, top_k)
If top_k=5 the output of the predict function should be something like this:
probs, classes = predict(image_path, model, 5)
print(probs)
print(classes)
> [ 0.01558163 0.01541934 0.01452626 0.01443549 0.01407339]
> ['70', '3', '45', '62', '55']
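Going from a full probability vector to the top $K$ entries can be sketched with plain NumPy. The probabilities below are made up, and the `+ 1` shift reflects that this dataset's label map uses 1-indexed string keys:

```python
import numpy as np

# Hypothetical softmax output for one image over 6 classes
probs = np.array([[0.05, 0.40, 0.10, 0.25, 0.15, 0.05]])
top_k = 3

# Negate so that argsort/sort (ascending) yield the largest values first
top_indices = np.argsort(-probs, axis=1)[:, :top_k]
top_probs = -np.sort(-probs, axis=1)[:, :top_k]

# Convert 0-based indices to the dataset's 1-indexed string labels
top_classes = [str(i + 1) for i in top_indices[0]]
print(top_classes)  # ['2', '4', '5']
```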
Your predict function should use PIL to load the image from the given image_path. The Image.open() function returns a PIL Image object, which you can convert to a NumPy array with the np.asarray() function.
The predict function will also need to handle pre-processing the input image such that it can be used by your model. We recommend you write a separate function called process_image that performs the pre-processing. You can then call the process_image function from the predict function.
The process_image function should take in an image (in the form of a NumPy array) and return an image in the form of a NumPy array with shape (224, 224, 3).
First, you should convert your image into a TensorFlow Tensor and then resize it to the appropriate size using tf.image.resize.
Second, the pixel values of the input images are typically encoded as integers in the range 0-255, but the model expects the pixel values to be floats in the range 0-1. Therefore, you'll also need to normalize the pixel values.
Finally, convert your image back to a NumPy array using the .numpy() method.
# TODO: Create the process_image function
def process_image(image_array):
    # Convert the NumPy array to a TensorFlow tensor
    image_tensor = tf.convert_to_tensor(image_array)
    # Scale the pixel values from integers in 0-255 to floats in 0-1
    image_tensor = tf.cast(image_tensor, tf.float32)
    image_tensor /= 255
    # Resize to the 224 x 224 input size the model expects
    image_tensor = tf.image.resize(image_tensor, [224, 224])
    return image_tensor.numpy()
To check your process_image function we have provided 4 images in the ./test_images/ folder:

- cautleya_spicata.jpg
- hard-leaved_pocket_orchid.jpg
- orange_dahlia.jpg
- wild_pansy.jpg
The code below loads one of the above images using PIL and plots the original image alongside the image produced by your process_image function. If your process_image function works, the plotted image should be the correct size.
from PIL import Image
image_path = './test_images/hard-leaved_pocket_orchid.jpg'
im = Image.open(image_path)
test_image = np.asarray(im)
processed_test_image = process_image(test_image)
fig, (ax1, ax2) = plt.subplots(figsize=(10,10), ncols=2)
ax1.imshow(test_image)
ax1.set_title('Original Image')
ax2.imshow(processed_test_image)
ax2.set_title('Processed Image')
plt.tight_layout()
plt.show()
Once you can get images into the correct format, it's time to write the predict function for running inference with your model.
Remember, the predict function should take an image path, a model, and an integer $K$, and return the top $K$ most likely class labels along with their probabilities. The function call should look like:
probs, classes = predict(image_path, model, top_k)
If top_k=5 the output of the predict function should be something like this:
probs, classes = predict(image_path, model, 5)
print(probs)
print(classes)
> [ 0.01558163 0.01541934 0.01452626 0.01443549 0.01407339]
> ['70', '3', '45', '62', '55']
Your predict function should use PIL to load the image from the given image_path. The Image.open() function returns a PIL Image object, which you can convert to a NumPy array with the np.asarray() function.
Note: The image returned by the process_image function is a NumPy array with shape (224, 224, 3) but the model expects the input images to be of shape (1, 224, 224, 3). This extra dimension represents the batch size. We suggest you use the np.expand_dims() function to add the extra dimension.
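The batch-dimension note above amounts to a one-line shape change, illustrated here with a zero-filled stand-in for a real processed image:

```python
import numpy as np

# A processed image has shape (224, 224, 3); the model expects a leading batch axis
image = np.zeros((224, 224, 3), dtype=np.float32)
batched = np.expand_dims(image, axis=0)
print(image.shape)    # (224, 224, 3)
print(batched.shape)  # (1, 224, 224, 3)
```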
# TODO: Create the predict function
def predict(image_path, model, top_k):
    # Load the image with PIL and convert it to a NumPy array
    img = Image.open(image_path)
    test_image = np.asarray(img)
    # Resize and normalize, then add the batch dimension the model expects
    processed_image = process_image(test_image)
    processed_image = np.expand_dims(processed_image, axis=0)
    # Accept either a loaded Keras model or a path to a saved .h5 file,
    # so the model is not reloaded from disk unless necessary
    if isinstance(model, str):
        model = tf.keras.models.load_model(model,
                                           custom_objects={'KerasLayer': hub.KerasLayer})
    img_prob = model.predict(processed_image)
    # Return the top_k probabilities (descending) and their class indices
    probs = -np.sort(-img_prob)[:, :int(top_k)]
    classes = np.argsort(-img_prob)[:, :int(top_k)]
    return probs, classes
It's always good to check the predictions made by your model to make sure they are correct. To check your predictions we have provided 4 images in the ./test_images/ folder:

- cautleya_spicata.jpg
- hard-leaved_pocket_orchid.jpg
- orange_dahlia.jpg
- wild_pansy.jpg
In the cell below use matplotlib to plot the input image alongside the probabilities for the top 5 classes predicted by your model. Plot the probabilities as a bar graph. The plot should look like this:

You can convert from the class integer labels to actual flower names using class_names.
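Because predict returns 0-based NumPy indices while class_names uses 1-indexed string keys, the lookup needs a `+ 1` shift and a str() conversion. A small sketch with an illustrative subset of label_map.json entries:

```python
# Illustrative subset of the class_names mapping loaded from label_map.json;
# keys are 1-indexed string labels, values are flower names
class_names = {'1': 'pink primrose',
               '2': 'hard-leaved pocket orchid',
               '3': 'canterbury bells'}

# predict returns 0-based indices, so shift by one before looking up
predicted_index = 1
flower_name = class_names[str(predicted_index + 1)]
print(flower_name)  # hard-leaved pocket orchid
```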
# TODO: Plot the input image along with the top 5 classes
image_path = './test_images/hard-leaved_pocket_orchid.jpg'
img = Image.open(image_path)
img = np.asarray(img)
num_classes = 5
img_probs, img_classes = predict(image_path,
'./Oxford_Flowers102_model_MobileNet.h5',
num_classes)
classes = [class_names[str(img_classes.squeeze()[i] + 1)] for i in range(num_classes)]
fig, (ax1, ax2) = plt.subplots(figsize = (15, 5), ncols = 2)
ax1.imshow(img)
ax1.axis('off')
ax2.barh(np.arange(num_classes), img_probs.squeeze())
ax2.set_xlim(0, 1.1)
ax2.set_yticks(np.arange(num_classes))
ax2.set_yticklabels(classes)
ax2.set_title('Class Probability Distribution')
Text(0.5, 1.0, 'Class Probability Distribution')
image_path = './test_images/orange_dahlia.jpg'
img = Image.open(image_path)
img = np.asarray(img)
num_classes = 5
img_probs, img_classes = predict(image_path,
'./Oxford_Flowers102_model_MobileNet.h5',
num_classes)
classes = [class_names[str(img_classes.squeeze()[i] + 1)] for i in range(num_classes)]
fig, (ax1, ax2) = plt.subplots(figsize = (15, 5), ncols = 2)
ax1.imshow(img)
ax1.axis('off')
ax2.barh(np.arange(num_classes), img_probs.squeeze())
ax2.set_xlim(0, 1.1)
ax2.set_yticks(np.arange(num_classes))
ax2.set_yticklabels(classes)
ax2.set_title('Class Probability Distribution')
Text(0.5, 1.0, 'Class Probability Distribution')
image_path = './test_images/wild_pansy.jpg'
img = Image.open(image_path)
img = np.asarray(img)
num_classes = 5
img_probs, img_classes = predict(image_path,
'./Oxford_Flowers102_model_MobileNet.h5',
num_classes)
classes = [class_names[str(img_classes.squeeze()[i] + 1)] for i in range(num_classes)]
fig, (ax1, ax2) = plt.subplots(figsize = (15, 5), ncols = 2)
ax1.imshow(img)
ax1.axis('off')
ax2.barh(np.arange(num_classes), img_probs.squeeze())
ax2.set_xlim(0, 1.1)
ax2.set_yticks(np.arange(num_classes))
ax2.set_yticklabels(classes)
ax2.set_title('Class Probability Distribution')
Text(0.5, 1.0, 'Class Probability Distribution')
image_path = './test_images/cautleya_spicata.jpg'
img = Image.open(image_path)
img = np.asarray(img)
num_classes = 5
img_probs, img_classes = predict(image_path,
'./Oxford_Flowers102_model_MobileNet.h5',
num_classes)
classes = [class_names[str(img_classes.squeeze()[i] + 1)] for i in range(num_classes)]
fig, (ax1, ax2) = plt.subplots(figsize = (15, 5), ncols = 2)
ax1.imshow(img)
ax1.axis('off')
ax2.barh(np.arange(num_classes), img_probs.squeeze())
ax2.set_xlim(0, 1.1)
ax2.set_yticks(np.arange(num_classes))
ax2.set_yticklabels(classes)
ax2.set_title('Class Probability Distribution')
Text(0.5, 1.0, 'Class Probability Distribution')